Search for: All records
Total Resources: 6
- Author / Contributor
  - Liu, Daogao (6)
  - Lee, Yin Tat (3)
  - Tian, Kevin (3)
  - Asi, Hilal (2)
  - Gopi, Sivakanth (2)
  - Brown, Gavin (1)
  - Carmon, Yair (1)
  - Dvijotham, Krishnamurthy (1)
  - Evans, Georgina (1)
  - Jambulapati, Arun (1)
  - Jin, Yujia (1)
  - Lowy, Andrew (1)
  - Shen, Ruoqi (1)
  - Sidford, Aaron (1)
  - Smith, Adam (1)
  - Thakurta, Abhradeep (1)
-
Asi, Hilal; Liu, Daogao; Tian, Kevin (https://doi.org/10.48550/arXiv.2406.02789) This paper studies differentially private stochastic convex optimization (DP-SCO) in the presence of heavy-tailed gradients, where only a kth-moment bound on the sample Lipschitz constants is assumed instead of a uniform bound. The authors propose a reduction-based approach that achieves the first near-optimal error rates (up to logarithmic factors) in this setting. Specifically, under (ε, δ)-approximate differential privacy, they achieve an error bound of G_2/√n + G_k · (√d/(nε))^(1 − 1/k), up to a mild polylogarithmic factor in 1/δ, where G_2 and G_k are the 2nd and kth moment bounds on the sample Lipschitz constants (the rate is typeset in the LaTeX sketch after the result list). This nearly matches the lower bound established by Lowy and Razaviyayn (2023). Beyond this basic result, the authors introduce a suite of private algorithms that further improve performance under additional assumptions: an optimal algorithm under a known Lipschitz constant, a near-linear-time algorithm for smooth functions, and an optimal linear-time algorithm for smooth generalized linear models.
-
Brown, Gavin; Dvijotham, Krishnamurthy; Evans, Georgina; Liu, Daogao; Smith, Adam; Thakurta, Abhradeep (International Conference on Machine Learning)
-
Carmon, Yair; Jambulapati, Arun; Jin, Yujia; Lee, Yin Tat; Liu, Daogao; Sidford, Aaron; Tian, Kevin (IEEE)
-
Gopi, Sivakanth; Lee, Yin Tat; Liu, Daogao; Shen, Ruoqi; Tian, Kevin (SODA 2023)
-
Gopi, Sivakanth; Lee, Yin Tat; Liu, Daogao (COLT 2022)
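For readability, here is a LaTeX rendering of the error rate quoted in the first abstract above. It is a minimal sketch assuming the standard DP-SCO notation (n samples, dimension d, privacy parameters (ε, δ), moment bounds G_2 and G_k); polylogarithmic factors in 1/δ are suppressed, as in the abstract.

% Minimal LaTeX sketch of the heavy-tailed DP-SCO rate stated in the abstract of arXiv:2406.02789.
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
\[
  \text{error} \;\lesssim\; \frac{G_2}{\sqrt{n}}
  \;+\; G_k \left( \frac{\sqrt{d}}{n \varepsilon} \right)^{1 - 1/k}
\]
% $n$: sample size, $d$: dimension, $\varepsilon$: privacy parameter,
% $G_2$, $G_k$: 2nd and $k$th moment bounds on the sample Lipschitz constants.
% Polylogarithmic factors in $1/\delta$ are suppressed.
\end{document}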